A Batch, Derivative-Free Algorithm for Finding Multiple Local Minima

Authors

  • Jeffrey Larson
  • Stefan M. Wild
Abstract

We propose a derivative-free algorithm for finding high-quality local minima of functions that require significant computational resources to evaluate. Our algorithm efficiently utilizes the computational resources allocated to it and also has strong theoretical guarantees: almost surely, it starts only a finite number of local optimization runs and identifies all local minima. We propose metrics for measuring how efficiently an algorithm finds local minima, and we benchmark our algorithm on synthetic problems (with known local minima) and two real-world applications.
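
As a rough illustration of the batch multistart idea described above (not the authors' algorithm), the Python sketch below samples a batch of points, evaluates them, and starts a derivative-free local run from each sample that is best within a neighborhood and not near an already-identified minimum. The selection rule, radius, and all names are simplifications chosen for this example.

# A minimal, illustrative sketch of batch multistart with derivative-free
# local runs; the selection rule and parameters are assumptions, not the
# paper's algorithm.
import numpy as np
from scipy.optimize import minimize


def batch_multistart(f, bounds, n_batch=50, radius=0.2, tol=1e-2, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds).T                     # bounds: [(lo, hi), ...]
    X = rng.uniform(lo, hi, size=(n_batch, len(lo)))  # one batch of sample points
    fX = np.array([f(x) for x in X])                  # evaluate the batch (parallelizable)

    minima = []
    for i, x in enumerate(X):
        near = np.linalg.norm(X - x, axis=1) < radius
        if fX[i] > fX[near].min():                    # not the best sample in its neighborhood
            continue
        if any(np.linalg.norm(x - m) < radius for m in minima):
            continue                                  # too close to an already-found minimum
        res = minimize(f, x, method="Nelder-Mead")    # derivative-free local run
        if all(np.linalg.norm(res.x - m) > tol for m in minima):
            minima.append(res.x)
    return minima


if __name__ == "__main__":
    # Six-hump camel function: a standard test problem with several local minima.
    def camel(z):
        x, y = z
        return (4 - 2.1 * x**2 + x**4 / 3) * x**2 + x * y + (-4 + 4 * y**2) * y**2

    for m in batch_multistart(camel, [(-2, 2), (-1, 1)]):
        print(np.round(m, 3), round(camel(m), 4))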

Similar references

Third-Order and Fourth-Order Iterative Methods Free from Second Derivative for Finding Multiple Roots of Nonlinear Equations

In this paper, we present two new families of third-order and fourth-order methods for finding multiple roots of nonlinear equations. Each method requires one evaluation of the function and two evaluations of its first derivative per iteration. Several numerical examples are given to illustrate the performance of the presented methods.
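
The paper's third- and fourth-order families are not reproduced here; for orientation only, the sketch below implements the classical modified Newton iteration for a root of known multiplicity m, the standard baseline such higher-order methods improve upon. Function and helper names are illustrative.

# Classical modified Newton step for a root of known multiplicity m:
# x_{k+1} = x_k - m * f(x_k) / f'(x_k).  Not the paper's methods.
def modified_newton(f, df, x0, m, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= m * fx / df(x)      # multiplicity-aware Newton step
    return x


if __name__ == "__main__":
    # f(x) = (x - 1)^3 * (x + 2) has a root of multiplicity 3 at x = 1.
    f = lambda x: (x - 1) ** 3 * (x + 2)
    df = lambda x: 3 * (x - 1) ** 2 * (x + 2) + (x - 1) ** 3
    print(modified_newton(f, df, x0=2.5, m=3))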

Global Convergence of Radial Basis Function Trust-Region Algorithms for Derivative-Free Optimization

We analyze globally convergent, derivative-free trust-region algorithms relying on radial basis function interpolation models. Our results extend the recent work of Conn, Scheinberg, and Vicente [SIAM J. Optim., 20 (2009), pp. 387–415] to fully linear models that have a nonlinear term. We characterize the types of radial basis functions that fit in our analysis and thus show global convergence ...
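
To make the kind of model concrete, the following sketch fits a radial basis function interpolant with a linear polynomial tail, i.e., a fully linear model with a nonlinear term, by solving the usual saddle-point system. The cubic kernel and all names are assumptions for illustration, not the paper's specific construction.

# RBF surrogate s(x) = sum_i lam_i * phi(||x - y_i||) + c0 + c.x, fit by
# interpolation; an illustrative sketch, not the analyzed algorithm.
import numpy as np


def fit_rbf_model(Y, fY):
    """Y: (n, d) interpolation points, fY: (n,) function values."""
    n, d = Y.shape
    dist = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)
    Phi = dist**3                              # cubic RBF kernel phi(r) = r^3
    P = np.hstack([np.ones((n, 1)), Y])        # linear polynomial tail [1, x]
    A = np.block([[Phi, P], [P.T, np.zeros((d + 1, d + 1))]])
    rhs = np.concatenate([fY, np.zeros(d + 1)])
    coef = np.linalg.solve(A, rhs)
    lam, c = coef[:n], coef[n:]

    def model(x):
        r = np.linalg.norm(Y - x, axis=1)
        return lam @ r**3 + c[0] + c[1:] @ x   # RBF part + linear tail

    return model


# Usage: interpolate f(x) = sum(x^2) on a few points and check the fit.
rng = np.random.default_rng(0)
Y = rng.uniform(-1, 1, size=(8, 2))
surrogate = fit_rbf_model(Y, np.array([y @ y for y in Y]))
print(surrogate(Y[0]), Y[0] @ Y[0])            # model reproduces the data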

A Derivative-Free Filter Driven Multistart Technique for Global Optimization

A stochastic global optimization method based on a multistart strategy and a derivative-free filter local search for general constrained optimization is presented and analyzed. In the local search procedure, approximate descent directions for the constraint violation or the objective function are used to progress towards the optimal solution. The algorithm is able to locate all the local minima...
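
A minimal sketch of the filter concept such methods rely on is given below: a trial point is accepted only if no stored (constraint violation, objective) pair dominates it. The margin and names are illustrative assumptions, not the paper's exact acceptance rule.

# Filter acceptance: keep a list of (constraint violation h, objective f)
# pairs and accept a trial point only if it improves h or f against every
# stored entry.  Margins and names are illustrative.
def acceptable(h, f, filt, gamma=1e-5):
    return all(h < hk - gamma * hk or f < fk - gamma * hk for hk, fk in filt)


def add_to_filter(h, f, filt):
    # Drop entries the new pair dominates, then store it.
    filt[:] = [(hk, fk) for hk, fk in filt if hk < h or fk < f]
    filt.append((h, f))


# Usage: the second trial is dominated by the first and is rejected.
filt = []
for h, f in [(0.5, 3.0), (0.6, 3.5), (0.1, 2.0)]:
    if acceptable(h, f, filt):
        add_to_filter(h, f, filt)
        print("accepted", (h, f))
    else:
        print("rejected", (h, f))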

A Derivative-Free Trust-Region Algorithm for the Optimization of Functions Smoothed via Gaussian Convolution Using Adaptive Multiple Importance Sampling

In this paper we consider the optimization of a functional F defined as the convolution of a function f with a Gaussian kernel. This type of objective function is of interest in the optimization of the expensive output of complex computational simulations, which often present some form of deterministic noise and need to be smoothed for the results to be meaningful. We introduce a derivative-fre...
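
The smoothed objective is F(x) = E[f(x + u)] with Gaussian u. The sketch below estimates F at a new point by importance-reweighting previously evaluated samples, which illustrates why sample reuse matters when f is expensive; it uses plain importance sampling rather than the paper's adaptive multiple importance sampling, and all names are illustrative.

# Estimate the Gaussian-smoothed objective F(x) = E[f(x + u)], u ~ N(0, s^2 I),
# by reweighting samples already drawn (and evaluated) around another point.
import numpy as np


def smoothed_estimate(x, samples, fvals, center, sigma):
    """Estimate F(x) from samples z_i ~ N(center, sigma^2 I) with f(z_i) known."""
    def log_gauss(z, mu):
        return -np.sum((z - mu) ** 2, axis=1) / (2 * sigma**2)
    w = np.exp(log_gauss(samples, x) - log_gauss(samples, center))  # density ratio
    return np.mean(w * fvals)


rng = np.random.default_rng(1)
f = lambda z: np.sum(z**2, axis=1) + 0.1 * np.sin(50 * z[:, 0])   # "noisy" objective
sigma, center = 0.3, np.array([1.0, 1.0])
Z = center + sigma * rng.standard_normal((5000, 2))                # evaluated once
fZ = f(Z)
print(smoothed_estimate(np.array([0.9, 1.1]), Z, fZ, center, sigma))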

Journal:

Volume / Issue:

Pages:

Publication date: 2015